
    Optimal Eigenvalue Approximation via Sketching

    Given a symmetric matrix $A$, we show from the simple sketch $GAG^T$, where $G$ is a Gaussian matrix with $k = O(1/\epsilon^2)$ rows, that there is a procedure for approximating all eigenvalues of $A$ simultaneously to within $\epsilon \|A\|_F$ additive error with large probability. Unlike the work of (Andoni, Nguyen, SODA 2013), we do not require that $A$ is positive semidefinite, and therefore we can recover sign information about the spectrum as well. Our result also significantly improves upon the sketching dimension of recent work for this problem (Needell, Swartworth, Woodruff, FOCS 2022), and in fact gives an optimal sketching dimension. Our proof develops new properties of the singular values of $GA$ for a $k \times n$ Gaussian matrix $G$ and an $n \times n$ matrix $A$, which may be of independent interest. Additionally, we achieve tight bounds in terms of matrix-vector queries: our sketch can be computed using $O(1/\epsilon^2)$ matrix-vector multiplies, and by improving on lower bounds for the so-called rank estimation problem, we show that this number is optimal even for adaptive matrix-vector queries.
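    A minimal sketch of the measurement step only: forming $GAG^T$ from $k$ matrix-vector products with $A$, which is all the access to $A$ the abstract's query model allows. The eigenvalue-recovery procedure is the paper's contribution and is not reproduced here; the function name and the $1/\sqrt{k}$ scaling below are illustrative assumptions.

    ```python
    import numpy as np

    def gaussian_sketch(matvec, n, k, rng):
        """Form the k x k sketch G A G^T using only k matrix-vector
        products with A (matvec(v) must return A @ v)."""
        G = rng.standard_normal((k, n)) / np.sqrt(k)  # Gaussian sketching matrix
        AGt = np.column_stack([matvec(G[i]) for i in range(k)])  # A G^T via k matvecs
        return G @ AGt  # the k x k sketch G A G^T

    # Example: sketch a random symmetric matrix that is NOT positive
    # semidefinite -- the setting the abstract emphasizes.
    rng = np.random.default_rng(0)
    n, k = 200, 40
    B = rng.standard_normal((n, n))
    A = (B + B.T) / 2  # symmetric, with both positive and negative eigenvalues

    S = gaussian_sketch(lambda v: A @ v, n, k, rng)
    print(S.shape)              # (40, 40)
    print(np.allclose(S, S.T))  # True: the sketch inherits symmetry from A
    ```

    Note that the sketch is computed with exactly $k = 40$ matrix-vector multiplies, matching the $O(1/\epsilon^2)$ query count claimed above.
    
    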

    Testing Positive Semidefiniteness Using Linear Measurements

    We study the problem of testing whether a symmetric $d \times d$ input matrix $A$ is positive semidefinite (PSD), or is $\epsilon$-far from the PSD cone, meaning that $\lambda_{\min}(A) \leq -\epsilon \|A\|_p$, where $\|A\|_p$ is the Schatten-$p$ norm of $A$. In applications one often needs to quickly tell whether an input matrix is PSD, and a small distance from the PSD cone may be tolerable. We consider two well-studied query models for measuring efficiency, namely the matrix-vector and vector-matrix-vector query models. We first consider one-sided testers, which correctly classify any PSD input but may fail on a non-PSD input with a tiny failure probability. Up to logarithmic factors, in the matrix-vector query model we show a tight $\widetilde{\Theta}(1/\epsilon^{p/(2p+1)})$ bound, while in the vector-matrix-vector query model we show a tight $\widetilde{\Theta}(d^{1-1/p}/\epsilon)$ bound, for every $p \geq 1$. We also show a strong separation between one-sided and two-sided testers in the vector-matrix-vector model, where a two-sided tester may fail on both PSD and non-PSD inputs with a tiny failure probability. In particular, for the important case of the Frobenius norm, we show that any one-sided tester requires $\widetilde{\Omega}(\sqrt{d}/\epsilon)$ queries. However, we introduce a bilinear sketch for two-sided testing from which we construct a Frobenius norm tester achieving the optimal $\widetilde{O}(1/\epsilon^2)$ queries. We also give a number of additional separations between adaptive and non-adaptive testers. Our techniques have implications beyond testing, providing new methods to approximate the spectrum of a matrix with Frobenius norm error using dimensionality reduction in a way that preserves the signs of eigenvalues.
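    To illustrate what "one-sided" means here (not the paper's algorithm): a tester that rejects only upon finding a certificate $x$ with $x^T A x < 0$ can never misclassify a PSD input, since $x^T A x \geq 0$ for every $x$ when $A$ is PSD. The naive random-query version below makes this concrete; the query budget and test matrices are illustrative assumptions.

    ```python
    import numpy as np

    def one_sided_psd_tester(matvec, d, num_queries, rng, tol=1e-9):
        """Naive one-sided tester in the matrix-vector query model:
        probe A with random Gaussian vectors and reject only if a
        certificate x with x^T A x < 0 is found."""
        for _ in range(num_queries):
            x = rng.standard_normal(d)
            # Scale-invariant negativity check; a PSD A never triggers it,
            # so the tester is one-sided by construction.
            if x @ matvec(x) < -tol * (x @ x):
                return "not PSD"
        return "PSD"

    rng = np.random.default_rng(1)
    psd = np.diag([1.0, 2.0, 3.0])    # PSD input: never rejected
    far = np.diag([1.0, 1.0, -10.0])  # far from PSD: large negative eigenvalue
    print(one_sided_psd_tester(lambda v: psd @ v, 3, 200, rng))  # PSD
    print(one_sided_psd_tester(lambda v: far @ v, 3, 200, rng))  # not PSD
    ```

    The abstract's point is that this certificate structure is exactly what makes one-sided testing expensive in the Frobenius-norm regime, and what the two-sided bilinear sketch avoids.
    
    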

    Testing Hereditary Properties of Sequences

    A hereditary property of a sequence is one that is preserved when restricting to subsequences. We show that there exist hereditary properties of sequences that cannot be tested with sublinear queries, resolving an open question posed by Newman et al. This proof relies crucially on an infinite alphabet, however; for finite alphabets, we observe that any hereditary property can be tested with a constant number of queries.
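    A small example of the definition (not of the paper's testers): monotonicity is hereditary, since deleting elements from a nondecreasing sequence cannot create an inversion. The helper names below are illustrative.

    ```python
    from itertools import combinations

    def is_monotone(seq):
        """Monotone nondecreasing: an example of a hereditary property."""
        return all(a <= b for a, b in zip(seq, seq[1:]))

    def subsequences(seq):
        """All sequences obtained by restricting seq to a subset of positions."""
        for r in range(len(seq) + 1):
            for idx in combinations(range(len(seq)), r):
                yield [seq[i] for i in idx]

    s = [1, 3, 3, 7]
    # Heredity: every subsequence of a monotone sequence is monotone.
    print(all(is_monotone(t) for t in subsequences(s)))  # True
    ```

    By contrast, a property such as "has length exactly 4" is not hereditary, since it fails on every proper subsequence.
    
    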